Fundamentals of Real Time Modeling

Introduction

What is Interactive Real Time

Real Time Graphics Hardware Limitations

Image Management Techniques

MultiGen Solves Developers' Problems


Introduction. This short paper addresses the problems and techniques that arise in any interactive simulation based on 3D computer-generated visuals. It is designed to introduce sales personnel to the challenges of real time 3D graphics. For any comments on the details of this paper, please contact:

Carl Suttle
MultiGen Product Manager
MultiGen Inc.
550 South Winchester Boulevard
Suite 500
San Jose, CA 95128

Tel: 408-261-4100
Fax: 408-261-4101
Email: csuttle@multigen.com

Background. MultiGen Inc. was established in 1981 to provide consulting services to the simulation industry. Today, the 50-person company focuses exclusively on the design and development of interactive, graphical modeling tools for 3D visual simulation. Its tools are used to create the scenes used in real time applications such as flight and driving simulation, location-based entertainment, virtual reality, games, CAD visualization, and architectural walk-through. Development of the MultiGen software began in 1985, with the first delivery to NASA/Ames Research Center in 1986. Systems for numerous image generator formats have been developed, including OpenFlight™. This format has been used by several commercial organizations that develop real time software on Silicon Graphics platforms, and is the primary format used by SGI Performer. Significant company resources are devoted to research and development of new options, enhancements, and compatibility with additional visual systems. The company is committed to producing products with the best price/functionality ratio available from any commercial source.

 

  • MultiGen polygonal and real time attribute modeler
  • Texture Editor and application tools
  • Automatic BSP Generation
  • A Disk Based Output Format, .flt

 

In order to understand the value of MultiGen and how it provides the graphics modeling component of an interactive developer's toolkit, we need to understand the key elements of a complete real time interactive application.

We assume the developer requires an attractive interactive simulation with 3D graphics being used as the main output medium.

To understand the techniques that can be used, we first need to understand what interactive real time is and the constraints placed upon the application. We will then go on to illustrate image management techniques (or the tricks used) that help 3D graphics systems maintain real time.

Interactive Real Time. In order to produce the illusion of smooth, continuous movement, a human player requires that visual and other stimuli be updated at regular intervals. In the visual domain this rate must be in excess of 17 Hz; the exact figure varies between applications and players.

This illusion is very desirable, but smooth continuous movement is also provided by video and movie film (at a 24 Hz update rate), and these media are not normally referred to as interactive.

To be interactive, the human player must be able to influence the application in a natural way. The time delay from user input until application output response, otherwise known as latency, is critical. Like the update rate for smooth movement, the acceptable latency is dependent upon the application and needs to be constant.

We can say that a real time interactive application will complete all of its major components: inputs, application computation, and outputs, within a given, fixed time period.

Inputs. In our case the inputs will be the reading of control positions and states from a keyboard, game controller, or simulator controls.

Application. The component that controls the simulation. The inputs are used to determine the actions of the player and of other elements within the simulation. The application provides a simulation environment in which to play, goals to be achieved, and the elements of competition and opposition which are the essence of the simulation. Upon completion of the application computation, the positions of the players, competition, and opposition, the score, status, changes in the simulation environment, the viewing position, special effects, audio, etc. are made available for output processing.

Outputs. The items detailed above are used to control the visual and audio rendering of the application. Depending on the application further outputs may be used to control the position of a motion platform and other physical stimulation devices.
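
The structure described above maps naturally onto a fixed-period loop. The sketch below is a minimal illustration, assuming a 30 Hz (33 msec) period; the function names (read_inputs, run_application, render_outputs, current_time_ms, wait_until) are hypothetical placeholders rather than any particular system's API.

    /* A minimal fixed-period simulation loop (illustrative sketch).
       All functions declared extern are hypothetical placeholders. */
    #define FRAME_PERIOD_MS 33          /* one iteration period at ~30 Hz */

    extern void read_inputs(void);      /* poll controls                   */
    extern void run_application(void);  /* update the simulation state     */
    extern void render_outputs(void);   /* draw the scene, play audio, ... */
    extern long current_time_ms(void);  /* read a millisecond clock        */
    extern void wait_until(long ms);    /* block until the given time      */

    void simulation_loop(void)
    {
        long next_frame = current_time_ms();
        for (;;) {
            read_inputs();              /* inputs                   */
            run_application();          /* application computation  */
            render_outputs();           /* outputs                  */

            /* All three stages must fit inside one fixed period;
               if they overrun, a frame is dropped and real time is lost. */
            next_frame += FRAME_PERIOD_MS;
            wait_until(next_frame);
        }
    }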

An interactive simulation is a closed loop system; the final important component is the human player. His/her perception of smooth, continuous movement of the visual, aural, and other stimuli, and of natural control, is the measure of the success of real time. In order to achieve this success the latency of the whole system must be constant and appropriate to the application.

To maintain the illusion of continuous movement the visual output and the objects in the scene must be regularly updated. The time taken from one update to the next is called the iteration period.

Referring to figure 1, we can illustrate the timing diagram of an application that exhibits interactive real time, where the iteration rate and latency of the application have been defined in terms of the CRT refresh rate. Real time applications will complete the essential elements of input, application, and output within a given time period. For the application detailed, the latency is approximately 100 msec, with an iteration period of 33 msec (30 Hz).

Why 30 Hz, and why maintain real time? 30 Hz is the frame rate of the simulation application's output device, a CRT. Most applications synchronize their iteration rate to a function of the CRT frame rate to avoid unacceptable noise and flicker of the display. This rate is fixed by the display standard of each country and is not under the game developer's control.

Any component of the whole system that does not maintain real time can easily be seen by the user as a glitch or simulator control anomaly. The effects of not being real time are typified below:

If the controls are not "read" on time then any control laws will be invalid and control lag will be evident. This mostly results in over control by the simulator user and can make acquiring a target or smooth control impossible.

If the computational application cannot complete its essential elements within the given time and the outputs "miss a frame", the simulation will slow down. The illusion of smooth, controlled movement is destroyed, and the player must adjust his technique to account for the changed movement flow. This can easily be seen in applications that consistently slow down to half of the target iteration rate when processing complex parts of the simulation.

Essential elements of the application will probably include the calculation of the new position and orientation of many objects in the scene. Non-essential components are those items that naturally occur at a slower rate, such as fuel levels or the calculation of the current score.

If the output devices controlling visual rendering cannot complete their processes within the given time, a frame, or many subsequent frames, will be dropped. Dropping even one frame can visually destroy the illusion and fluidity of the game, resulting in inconsistent control laws and game-playing technique.

The need to maintain real time within an interactive entertainment application is well understood, and real time can be seen in most 2D and 3D applications today. However, 3D real time graphics probably represents the largest technical challenge for many developers to master. Fortunately, many years of work completed in the defense and research communities are directly transferable to interactive entertainment.

We will now go on to detail the limitations of 3D graphics systems and some of the more common techniques available to ensure that the graphics system maintains real time.

Real Time Graphics Hardware Limitations. Today's state-of-the-art 3D graphics hardware has higher performance than ever before. Even so, data transfer and processing limits will always exist that define the quality of the scenes the 3D graphics system can produce.

State-of-the-art real time graphics hardware is designed to draw polygon primitives, usually triangles and quads, as fast as possible. There is a limit to how fast these primitives can be translated, scaled, and rotated by the graphics system. There is therefore a Polygon Limit.

Each of the polygons drawn has to be filled to represent a solid surface. The size and number of the polygons in the scene can reach a point at which a Fill Limit is reached. Too many polygons, polygons covering a large area of the screen, or many polygons drawn one behind the other can cause the fill limit to be reached.

Texture Mapping. The application of noise, other patterns, or photographs to polygons is a popular technique for providing detail in a polygonal scene without requiring more polygons to be drawn. Components of the texture, texels, must be mapped onto the polygons. Texture mapping also has its physical limits, expressed as a Texel Limit.

Hidden Surface Removal. Generally, all hidden surfaces in a scene must be removed prior to transfer to the screen frame buffer. Many techniques are available to ensure that hidden surfaces are not seen in the final image. Depending on the number of polygons and the scene to be processed, hidden surface removal will take more or less processing. Range Sorting, Binary Separating Planes, and the Z Buffer are common examples of hidden surface removal techniques. All these methods have advantages and disadvantages dependent upon the application. For instance, the Z buffer is usually inefficient when processing many polygons drawn one behind the other. Applications that exhibit this problem are said to have a high depth complexity, a problem usually found in ground simulation applications.

The penalty for exceeding any of these limits is that the new scene cannot be completed in time for the next transfer into screen memory. Thus the 3D graphics system misses an update and a frame is dropped. The illusion of smooth movement is destroyed.

Further, the load on the 3D graphics system depends largely on the viewpoint and the objects being displayed. These items change relatively slowly, even in the fastest game. It is likely, therefore, that the system will continue to drop frames for a lengthy period of time, the iteration rate being effectively halved or worse until the loading is reduced to within limits.

It would be possible to render objects with such low polygon counts that the above limits are never exceeded. These scenes would, however, be extremely simple and far less attractive than can be obtained by applying some image management techniques; some of the more common techniques are described below.

Image Management Techniques.

Culling

Object Culling. The simplest technique. If an object, a geographically compact collection of polygons, is not going to be displayed anywhere within the viewing frustum, then it is not considered for drawing. A test, usually carried out by the system CPU, determines whether an object should be culled. If no part of a bounding volume that completely surrounds the object lies within the viewing frustum, then the object can be culled and no further processing of it is required.

The viewing frustum is defined by the Near and Far Clipping Planes and the Field of View. The field of view is self-explanatory and is defined both vertically and horizontally. In any 3D scene it is generally possible to decide that objects closer than, or further than, a certain distance from the viewer do not need to be drawn. These distances define the near and far clipping planes.
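
As an illustration, assuming the frustum is stored as six inward-facing planes and each object carries a pre-computed bounding sphere, the cull test reduces to a handful of plane-distance checks. The structures and names below are assumptions for the sketch, not part of any particular system.

    /* Bounding-sphere against view-frustum cull test (illustrative sketch). */
    typedef struct { float x, y, z; } Vec3;
    typedef struct { Vec3 normal; float d; } Plane;   /* dot(normal, p) + d = 0, normal points inward */
    typedef struct { Vec3 center; float radius; } Sphere;

    /* Returns 1 if the object lies entirely outside the frustum and can be culled. */
    int cull_object(const Sphere *bound, const Plane frustum[6])
    {
        for (int i = 0; i < 6; ++i) {
            float dist = frustum[i].normal.x * bound->center.x
                       + frustum[i].normal.y * bound->center.y
                       + frustum[i].normal.z * bound->center.z
                       + frustum[i].d;
            if (dist < -bound->radius)
                return 1;               /* completely outside this plane: cull */
        }
        return 0;                       /* potentially visible: keep for drawing */
    }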

Level of Detail. The polygon budget per frame is very easily used up. (Even a very expensive workstation may only average 3000 polygons per full-resolution 30 Hz frame in a real application.) If an object is a relatively large distance from the user, then it is probable that the viewer cannot see all of the detail included in the model. Replacing this complex model with one constructed from fewer polygons will have little effect on the visual quality but will save a substantial portion of the polygon budget for other purposes. Switching from polygonally complex models to simpler, lower-polygon models based on distance is called level of detail switching.
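
A minimal sketch of distance-based switching follows; the level table, field names, and switching distances are illustrative assumptions.

    /* Level of detail selection by range from the viewer (illustrative sketch). */
    typedef struct {
        void  *geometry;        /* polygon data for this level of detail   */
        float  switch_in;       /* use this level once range >= switch_in  */
    } LodLevel;

    /* Levels are ordered from most to least detailed; level 0 has
       switch_in == 0 so that something is always drawn. */
    void *select_lod(const LodLevel levels[], int count, float range)
    {
        int chosen = 0;
        for (int i = 0; i < count; ++i) {
            if (range >= levels[i].switch_in)
                chosen = i;     /* farther away: accept a simpler model */
        }
        return levels[chosen].geometry;
    }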

Back Face Culling. Usually polygons are defined as having only one side. If any polygons are back-facing then they cannot be seen and are therefore not considered further by the graphics system. On occasion, especially where performance requirements are low, an object may be only one polygon thick. In this case the polygon needs to be marked as "2 sided" so that the graphics system knows how to handle the rendering. Back face culling typically saves drawing half the polygons in a given scene.
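
The back-facing test itself is a single dot product between the polygon's outward normal and the vector from one of its vertices to the eye; a sketch with illustrative types follows.

    /* Back-face test for a single-sided polygon (illustrative sketch). */
    typedef struct { float x, y, z; } Vec3;

    /* Returns 1 if the polygon faces away from the eye and can be skipped. */
    int is_back_facing(Vec3 normal, Vec3 vertex, Vec3 eye)
    {
        Vec3 to_eye = { eye.x - vertex.x, eye.y - vertex.y, eye.z - vertex.z };
        float facing = normal.x * to_eye.x
                     + normal.y * to_eye.y
                     + normal.z * to_eye.z;
        return facing <= 0.0f;          /* normal points away from the viewer */
    }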

Hidden Surface Removal

Range Sorting. In many cases today, low cost 3D graphics hardware has no support for hidden surface removal. In this case some method of sorting the drawing order of the polygons must be provided: the polygons to be displayed are ordered so that the graphics system draws them from the back to the front of the object, providing hidden surface removal.

Painter's Algorithm. A simplified version of the depth sort algorithm is often known as the "painter's algorithm". This technique draws an object in the order a painter might choose, painting closer polygons over more distant ones. In this case the drawing order of the polygons can be hard coded within the model.
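
A simple depth sort, assuming each polygon carries a representative distance from the viewer, can be written with the standard qsort; the Polygon structure and draw_polygon call are illustrative assumptions.

    /* Painter's algorithm: sort polygons far-to-near, then draw in that order. */
    #include <stdlib.h>

    typedef struct {
        float depth;            /* representative distance from the viewer */
        void *data;             /* vertices, color, texture, etc.          */
    } Polygon;

    static int farther_first(const void *a, const void *b)
    {
        float da = ((const Polygon *)a)->depth;
        float db = ((const Polygon *)b)->depth;
        return (da < db) - (da > db);   /* sort by descending depth */
    }

    extern void draw_polygon(const Polygon *p);   /* assumed renderer entry point */

    void draw_back_to_front(Polygon *polys, size_t count)
    {
        qsort(polys, count, sizeof(Polygon), farther_first);
        for (size_t i = 0; i < count; ++i)
            draw_polygon(&polys[i]);    /* nearer polygons overwrite farther ones */
    }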

Binary Separating Planes (BSP). If an object encloses some volume and is generally convex, or is a terrain skin (like most of the objects that we wish to draw), we can use BSPs to decide the drawing order at an object level. We can separate objects, or parts of an object, in the scene before run time processing by a plane or by multiple planes. These separating planes are used by the graphics system to define the order in which the objects should be drawn. The runtime algorithm that uses the BSPs is very efficient and is therefore used widely in image generation. Note that extra modeling effort and skill is required to insert the separating planes within the model.
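
At run time the use of a separating plane is simple: decide which side of the plane the eye is on, draw everything on the far side, then the geometry at the plane, then everything on the near side. A recursive sketch with illustrative structures follows.

    /* Back-to-front traversal of a tree of separating planes (illustrative sketch). */
    #include <stddef.h>

    typedef struct { float x, y, z; } Vec3;
    typedef struct BspNode {
        Vec3 normal; float d;           /* separating plane: dot(normal, p) + d = 0 */
        struct BspNode *front, *back;   /* sub-trees on either side of the plane    */
        void *geometry;                 /* polygons associated with this node       */
    } BspNode;

    extern void draw_geometry(void *geometry);    /* assumed renderer entry point */

    void draw_bsp(const BspNode *node, Vec3 eye)
    {
        if (node == NULL)
            return;
        float side = node->normal.x * eye.x
                   + node->normal.y * eye.y
                   + node->normal.z * eye.z
                   + node->d;
        if (side >= 0.0f) {             /* eye is on the front side                   */
            draw_bsp(node->back, eye);  /* so the back side is farther: draw it first */
            draw_geometry(node->geometry);
            draw_bsp(node->front, eye);
        } else {                        /* eye is on the back side                    */
            draw_bsp(node->front, eye);
            draw_geometry(node->geometry);
            draw_bsp(node->back, eye);
        }
    }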

Drawing Order and Transparency. In order to see an object through a transparent object it is necessary to draw both objects, with the transparent object drawn last. The drawing order of the objects, and of the polygons that make up the objects, can be hard coded within the model.
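
In practice this means splitting the scene into opaque and transparent pieces and drawing the opaque pieces first; a sketch with illustrative names follows.

    /* Draw opaque geometry first, transparent geometry last (illustrative sketch). */
    #include <stddef.h>

    typedef struct {
        int   transparent;      /* non-zero if this object uses transparency */
        void *geometry;
    } SceneObject;

    extern void draw_object(const SceneObject *obj);   /* assumed renderer call */

    void draw_scene(const SceneObject *objects, size_t count)
    {
        for (size_t i = 0; i < count; ++i)      /* first pass: opaque objects only */
            if (!objects[i].transparent)
                draw_object(&objects[i]);

        for (size_t i = 0; i < count; ++i)      /* second pass: transparent objects, */
            if (objects[i].transparent)         /* drawn last so that the objects    */
                draw_object(&objects[i]);       /* behind them remain visible        */
    }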

Z Buffer. No object or polygon sorting is required to use a Z buffer. This technique compares each pixel of each polygon in the scene for its distance from the viewer. The pixel closest to the viewer is the one actually drawn, or rendered. The Z buffer algorithm, whether implemented in software or hardware, can be a time-intensive task that usually has a performance cost. In hardware terms, the amount of memory required to implement a Z buffer is relatively high. Note that, in contrast, models rendered using this technique require no separating planes or polygonal drawing order (except in the case of transparency) and are therefore easier and simpler to model.
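
The per-pixel comparison at the heart of the Z buffer is easily sketched in software; the frame-buffer layout and resolution below are assumptions for illustration.

    /* Per-pixel Z buffer test (illustrative software sketch). */
    #define SCREEN_W 640
    #define SCREEN_H 480

    static unsigned int color_buffer[SCREEN_H][SCREEN_W];
    static float        depth_buffer[SCREEN_H][SCREEN_W];

    /* Reset the depth buffer to a very large ("far") value before each frame. */
    void clear_depth(void)
    {
        for (int y = 0; y < SCREEN_H; ++y)
            for (int x = 0; x < SCREEN_W; ++x)
                depth_buffer[y][x] = 1e30f;
    }

    /* Write a pixel only if it is nearer than what is already stored there. */
    void plot_pixel(int x, int y, float depth, unsigned int color)
    {
        if (depth < depth_buffer[y][x]) {
            depth_buffer[y][x] = depth;     /* remember the nearest depth so far */
            color_buffer[y][x] = color;     /* and the color that goes with it   */
        }
    }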

Using the above techniques we can ensure that the graphics engine draws only the polygons and textures that are relevant to the viewer. The polygon budget is wisely used. However, the graphics system can only apply these techniques if the information is pre-coded into the model databases. How do we do this?

MultiGen solves the customer's problems now and in the future.

We have already split the graphics system into two main components: the polygonal models, or databases, to be rendered, and the rendering system that renders the databases from an arbitrary viewpoint. To maintain real time, today's graphics systems make significant use of special attributes contained within real time visual databases.

The task of the developer of the graphics component of the game is to create attractive model databases that run in real time. MultiGen enables the developer to:

  • Create and edit attractive models and animation
  • Create extensive terrain and virtual world environments.
  • Add all of the attributes required for real time image generation.
    • Bounding boxes
    • Level of Detail
    • Separating Planes
    • Fixed Order
    • And more
  • Test and edit all of the above interactively in a user friendly environment.

The database modeler will create models in line with the simulator specifications.

Real time attributes are added to all models during database creation, i.e. bounding volumes are calculated, and lower levels of detail and switching distances are modeled. The modeling done within MultiGen should be completed with the type and parameters of the real time system in mind.

The model and terrain databases are now stored in a defined format on disk or tape for distribution to the target system.

Multiple Target Platforms. Having completed the database work, we must now display the databases on the target machine.

Depending on the application, hardware, and software, we can expect each individual manufacturer of graphics systems to use any number of the general techniques detailed in this paper. MultiGen users are able to efficiently create and edit models that contain a superset of the functionality required to support almost any individual graphics system. The modeler must therefore be knowledgeable and disciplined enough to use only those attributes available to him.

The .flt format stored on disk contains information not only for any graphics system but also for the modeling system, and is therefore intended to be an intermediate format, not the final runtime format.

The graphics system must load the database, strip out unnecessary information, and, if necessary, complete some re-organization as part of a loader process before any models can be rendered in real time. This loader and the graphics system are application specific and so are not provided as part of MultiGen.

Certain loaders and graphics software have been completed and are available. Performer, available from SGI, may be used in conjunction with a loader provided free in source or object form with MultiGen. Performer allows the visual scenes to be previewed on Silicon Graphics workstations in real time.

Loaders for other graphics systems are not yet available; however, "general case" source code called "fly_mg" can be provided to customers upon request from MultiGen Inc. Fly_mg is a simple graphics system for Silicon Graphics hardware. Written in C and using OpenGL, it is available for the customer to use directly, to port, or to use as a template for his application.

MultiGen Inc. has more experience with multiple target platforms than any other company in the world. (MultiGen is short for Multiple Image Generator.) MultiGen products are used on over 25 different types of image generators. If any questions arise about this subject and you need further help, please get in touch with your MultiGen contact.

 


MultiGen is a registered trademark and GameGen and SmartScene are trademarks of MultiGen, Inc. All other trademarks mentioned herein are the property of their respective companies.

Copyright 1997 MultiGen Inc.